- Build evaluation datasets for performance monitoring.
- Identify and resolve LLM content issues effectively.
- Gather examples for advanced tasks like fine-tuning.
## Provide feedback in the UI
In the Weave UI, you can add and view feedback from the call details page or by using the icons.

### From the call details page
- In the sidebar, navigate to Traces.
- Find the row for the call that you want to add feedback to.
- Open the call details page.
- Select the Feedback column for the call.
- Add, view, or delete feedback:
  - Add and view feedback using the icons located in the upper right corner of the call details feedback view.
  - View and delete feedback from the call details feedback table. To delete feedback, click the trashcan icon in the rightmost column of the appropriate feedback row.

### Use the icons
You can add or remove a reaction, and add a note, using the icons located in both the call table and individual call details pages.

- Call table: Located in the Feedback column in the appropriate row of the call table.
- Call details page: Located in the upper right corner of each call details page.
To add a reaction:

- Click the emoji icon.
- Add a thumbs up or thumbs down, or click the + icon for more emojis.

To remove a reaction:

- Hover over the emoji reaction that you want to remove.
- Click the reaction to remove it.
You can also delete feedback from the Feedback column on the call details page.

To add a comment:
- Click the comment bubble icon.
- In the text box, add your note.
- To save the note, press the Enter key. You can add additional notes.
The maximum number of characters in a feedback note is 1024. If a note exceeds this limit, it will not be created.

## Provide feedback via the SDK
You can use the Weave SDK to programmatically add, remove, and query feedback on calls. You can also find SDK usage examples for feedback in the UI under the Use tab on the call details page.
### Query a project’s feedback
You can query the feedback for your Weave project using the SDK. The SDK supports the following feedback query operations:

- `client.get_feedback()`: Returns all feedback in a project.
- `client.get_feedback("<feedback_uuid>")`: Returns a specific feedback object, specified by `<feedback_uuid>`, as a collection.
- `client.get_feedback(reaction="<reaction_type>")`: Returns all feedback objects for a specific reaction type.

Each feedback object returned by `client.get_feedback()` includes the following fields:

- `id`: The feedback object ID.
- `created_at`: The creation time information for the feedback object.
- `feedback_type`: The type of feedback (reaction, note, custom).
- `payload`: The feedback payload.
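For example, a minimal sketch of these query operations (the project name, feedback UUID, and reaction are placeholders):

```python
import weave

# Initialize the Weave client; "feedback-example" is an illustrative project name
client = weave.init("feedback-example")

# Return all feedback in the project
all_feedback = client.get_feedback()

# Return a specific feedback object, by UUID, as a collection
feedback_by_id = client.get_feedback("<feedback_uuid>")

# Return all feedback objects for a specific reaction type
thumbs_up = client.get_feedback(reaction="👍")

# Inspect the fields described above
for f in all_feedback:
    print(f.id, f.created_at, f.feedback_type, f.payload)
```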
### Add feedback to a call
You can add feedback to a call using the call’s UUID. To use the UUID to get a particular call, retrieve it during or after call execution. The SDK supports the following operations for adding feedback to a call:

- `call.feedback.add_reaction("<reaction_type>")`: Add one of the supported `<reaction_type>` values (emojis), such as 👍.
- `call.feedback.add_note("<note>")`: Add a note.
- `call.feedback.add("<label>", <object>)`: Add a custom feedback `<object>` specified by `<label>`.
The maximum number of characters in a feedback note is 1024. If a note exceeds this limit, it will not be created.
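A minimal sketch of these operations (the project name, call UUID, label, and payload are placeholders):

```python
import weave

client = weave.init("feedback-example")

# Retrieve a call by its UUID
call = client.get_call("<call_uuid>")

# Add an emoji reaction
call.feedback.add_reaction("👍")

# Add a note (maximum of 1024 characters)
call.feedback.add_note("This is a note.")

# Add custom feedback: a payload object identified by a label
call.feedback.add("correctness", {"value": 5})
```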
### Retrieve the call UUID
For scenarios where you need to add feedback immediately after a call, you can retrieve the call UUID programmatically during or after call execution.

#### During call execution

To retrieve the UUID during call execution, get the current call and return its ID.
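A sketch of this pattern, assuming the SDK exposes `weave.require_current_call()` for accessing the active call from inside an op (the project name and op are illustrative):

```python
import weave

weave.init("feedback-example")

@weave.op()
def simple_operation(input_value):
    output = f"Processed {input_value}"
    # Get the current call and return its ID alongside the result
    current_call = weave.require_current_call()
    return output, current_call.id
```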
#### After call execution

Alternatively, you can use the `call()` method to execute the operation and retrieve the ID after call execution.
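A minimal sketch of this approach, assuming ops expose a `.call()` method that returns both the result and the call object (project name and input are illustrative):

```python
import weave

client = weave.init("feedback-example")

@weave.op()
def simple_operation(input_value):
    return f"Processed {input_value}"

# Execute the op and retrieve both the result and the Call object
result, call = simple_operation.call("example input")
call_id = call.id
```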
### Delete feedback from a call

You can delete feedback from a particular call by specifying a UUID.
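A minimal sketch, assuming `call.feedback.purge()` accepts the UUID of the feedback object to delete (the project name and UUIDs are placeholders):

```python
import weave

client = weave.init("feedback-example")

call = client.get_call("<call_uuid>")

# Remove a specific feedback object from the call by its UUID
call.feedback.purge("<feedback_uuid>")
```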
## Add human annotations

Human annotations are supported in the Weave UI. To make human annotations, you must first create a human annotation scorer using either the UI or the API. Then, you can use the scorer in the UI to make annotations, and modify your annotation scorers using the API.

### Create a human annotation scorer in the UI
To create a human annotation scorer in the UI, do the following:

- In the sidebar, navigate to Scorers.
- In the upper right corner, click + Create scorer.
- In the configuration page, set:
  - Scorer type to Human annotation
  - Name
  - Description
  - Type, which determines the type of feedback that will be collected, such as `boolean` or `integer`.
- Click Create scorer. Now, you can use your scorer to make annotations.
For example, you might configure a scorer that classifies documents, where the Type selected for the score configuration is an `enum` containing the possible document types.

### Use the human annotation scorer in the UI
Once you create a human annotation scorer, it will automatically display in the Feedback sidebar of the call details page with the configured options. To use the scorer, do the following:

- In the sidebar, navigate to Traces.
- Find the row for the call that you want to add a human annotation to.
- Open the call details page.
- In the upper right corner, click the Show feedback button. Your available human annotation scorers display in the sidebar.
- Make an annotation.
- Click Save.
- In the call details page, click Feedback to view the calls table. The new annotation displays in the table. You can also view the annotations in the Annotations column in the call table in Traces. Refresh the call table to view the most up-to-date information.

### Create a human annotation scorer using the API
Human annotation scorers can also be created through the API. Each scorer is its own object, which is created and updated independently. To create a human annotation scorer programmatically, do the following:

- Import the `AnnotationSpec` class from `weave.flow.annotation_spec`.
- Use the `publish` method from `weave` to create the scorer.

In the following example, two scorers are created. The first scorer, `Temperature`, is used to score the perceived temperature of the LLM call. The second scorer, `Tone`, is used to score the tone of the LLM response. Each scorer is published with an associated object ID (`temperature-scorer` and `tone-scorer`).
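A sketch of that example (the project name and `field_schema` values are illustrative, and treating `field_schema` as JSON Schema is an assumption here):

```python
import weave
from weave.flow.annotation_spec import AnnotationSpec

client = weave.init("feedback-example")

# "Temperature": a numeric score for the perceived temperature of the LLM call
temperature_spec = AnnotationSpec(
    name="Temperature",
    description="The perceived temperature of the LLM call",
    field_schema={
        "type": "number",
        "minimum": -1,
        "maximum": 1,
    },
)

# "Tone": a categorical score for the tone of the LLM response
tone_spec = AnnotationSpec(
    name="Tone",
    description="The tone of the LLM response",
    field_schema={
        "type": "string",
        "enum": ["formal", "casual", "neutral"],
    },
)

# Publish each scorer with an associated object ID
weave.publish(temperature_spec, "temperature-scorer")
weave.publish(tone_spec, "tone-scorer")
```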
### Modify a human annotation scorer using the API
Expanding on creating a human annotation scorer using the API, the following example creates an updated version of the `Temperature` scorer by using the original object ID (`temperature-scorer`) on `publish`. The result is an updated object with a history of all versions.
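A sketch of that update (the revised `field_schema` is an illustrative change):

```python
import weave
from weave.flow.annotation_spec import AnnotationSpec

client = weave.init("feedback-example")

# Publishing with the same object ID ("temperature-scorer") creates a new
# version of the existing scorer rather than a separate object
updated_temperature_spec = AnnotationSpec(
    name="Temperature",
    description="The perceived temperature of the LLM call",
    field_schema={
        "type": "integer",  # changed from "number" as an illustrative revision
        "minimum": 0,
        "maximum": 4,
    },
)
weave.publish(updated_temperature_spec, "temperature-scorer")
```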
You can view human annotation scorer object history in the Scorers tab under Human annotations.

### Use a human annotation scorer using the API
The feedback API allows you to use a human annotation scorer by specifying a specially constructed name and an `annotation_ref` field. You can obtain the `annotation_spec_ref` from the UI by selecting the appropriate tab, or during the creation of the `AnnotationSpec`.
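A minimal sketch, assuming the specially constructed name is the scorer name with a `wandb.annotation.` prefix and that `call.feedback.add` accepts an `annotation_ref` keyword (both are assumptions here; the call UUID and spec ref are placeholders):

```python
import weave

client = weave.init("feedback-example")

call = client.get_call("<call_uuid>")

# Obtained from the UI or when publishing the AnnotationSpec
annotation_spec_ref = "<annotation_spec_ref>"

call.feedback.add(
    # Assumed naming convention for the specially constructed feedback type
    feedback_type="wandb.annotation.Temperature",
    payload={"value": 1},
    annotation_ref=annotation_spec_ref,
)
```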